wget command
wget - The non-interactive network downloader.
The wget command in Linux is a powerful tool used to download files from the web via HTTP, HTTPS, or FTP. It is non-interactive, which makes it ideal for scripts or for downloading files directly in the terminal.
Usage: wget [OPTION]... [URL]...
- URL: The web address of the file to download.
- OPTION: Flags to modify behavior (e.g., output file, retries).
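Because wget is non-interactive and reports failure through its exit status, it slots cleanly into shell scripts. A minimal sketch, assuming a placeholder URL and output name:

```sh
#!/bin/sh
# Fetch a file quietly; wget exits non-zero if the download fails.
# The URL and filename below are placeholders for this sketch.
if wget -q -O report.csv https://example.com/report.csv; then
    echo "Downloaded report.csv"
else
    echo "Download failed" >&2
    exit 1
fi
```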
Common Options
| Option | Description |
|---|---|
| -O | Specify output filename |
| -P | Specify output directory |
| -c | Resume a partial download |
| -i | Download from a list of URLs |
| --limit-rate | Limit download speed (e.g., 200k) |
| -r | Recursive download |
| -l | Limit recursion depth |
| -b | Run in background |
| -q | Quiet mode (no output) |
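These options can be combined freely. For instance, a sketch that resumes a large download into a chosen directory while capping bandwidth (the directory and URL are illustrative):
wget -c -P ~/downloads --limit-rate=500k https://example.com/largefile.zip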
Examples
- Basic Download
Download a file from a URL to the current directory.
wget https://example.com/file.txt
- Downloads file.txt to your current directory.
- Shows a progress bar with speed and size.
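The -q flag from the options table silences that progress output, which is handy in cron jobs or scripts; a sketch using the same example URL:
wget -q https://example.com/file.txt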
- Specifying Output File
Use -O (uppercase) to save the file with a specific name.
wget -O myfile.txt https://example.com/file.txt
- Saves as myfile.txt instead of the default name.
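A related use: passing -O - writes the download to standard output so it can be piped to another command. A sketch with the same example URL (-q keeps wget's progress messages off the terminal):
wget -q -O - https://example.com/file.txt | head -n 5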
- Downloading to a Directory
Use -P to specify a directory for the downloaded file.
wget -P /home/user/downloads https://example.com/file.txt
- Saves file.txt to /home/user/downloads.
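The directory prefix applies to every URL on the command line, so several files can land in the same place at once (URLs as in the later examples):
wget -P /home/user/downloads https://example.com/file1.txt https://example.com/file2.txt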
- Resuming Downloads
Use -c to resume a partially downloaded file.
wget -c https://example.com/largefile.zip
- Continues downloading largefile.zip if it was interrupted (e.g., by Ctrl+C).
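On a flaky connection, -c pairs well with --tries, which limits how many attempts wget makes before giving up; a sketch with the same URL:
wget -c --tries=10 https://example.com/largefile.zip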
- Downloading Multiple Files
Pass multiple URLs, or supply a file of URLs via -i.
Example (Multiple URLs):
wget https://example.com/file1.txt https://example.com/file2.txt
Example (From File):
Create urls.txt:
https://example.com/file1.txt
https://example.com/file2.txt
Then:
wget -i urls.txt
- Downloads all listed files.
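The list file combines with the other options, for example resuming the whole batch into one directory (directory name illustrative):
wget -c -P ~/downloads -i urls.txt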
- Limiting Download Speed
Use --limit-rate to cap bandwidth usage.
wget --limit-rate=200k https://example.com/largefile.zip
- Limits speed to 200 KB/s (k = KB, m = MB).
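The same flag accepts megabyte values, e.g. a 2 MB/s cap on the example download:
wget --limit-rate=2m https://example.com/largefile.zip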
- Recursive Download
Use -r to download a website or directory recursively.
wget -r https://example.com/docs/
- Downloads docs/ and its contents (e.g., HTML, images).
- Use -l to limit depth:
wget -r -l 2 https://example.com/docs/
- Goes 2 levels deep.
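Recursive fetches are easy to fence in further: --no-parent stops wget from climbing above the starting directory, and -A restricts downloads to matching file types. A sketch that keeps only PDFs two levels deep, using the same example URL:
wget -r -l 2 --no-parent -A pdf https://example.com/docs/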
- Background Download
Use -b to run wget in the background.
wget -b https://example.com/largefile.zip
- Output goes to wget-log; check progress with tail -f wget-log.
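If the default wget-log name is not wanted, the -o option writes the log to a file of your choosing (log name illustrative):
wget -b -o download.log https://example.com/largefile.zip
tail -f download.log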
To get help with the wget command, use the --help option (wget --help).
For more details, check the manual with man wget.